Agnostic Online Learnability

Author

  • Shai Shalev-Shwartz
Abstract

We study a fundamental question: which classes of hypotheses are learnable in the online learning model? The analogous question in the PAC learning model [12] was addressed by Vapnik and others [13], who showed that the VC dimension characterizes the learnability of a hypothesis class. In his influential work, Littlestone [9] studied the online learnability of hypothesis classes, but only in the realizable case, namely, assuming that there exists a hypothesis in the class that perfectly explains the entire data. In this paper we study online learnability in the agnostic case, namely, when no hypothesis perfectly predicts the entire data and our goal is to minimize regret. We first present an impossibility result, discovered by Cover in the context of universal prediction of individual sequences, which implies that even a class whose Littlestone dimension is only 1 is not learnable in the agnostic online learning model. We then overcome the impossibility result by allowing randomized predictions, and show that in this case Littlestone's dimension does capture the learnability of hypothesis classes in the agnostic online learning model.
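To make the randomized-prediction idea concrete (this is only an illustrative sketch, not the paper's construction for general Littlestone classes), the snippet below runs randomized weighted majority over a finite set of experts under the 0-1 loss; the function name, the learning rate eta, and the use of NumPy are choices made for this example. Here the regret after T rounds is the learner's expected number of mistakes minus the number of mistakes of the best expert in hindsight.

```python
import numpy as np

def randomized_weighted_majority(experts, outcomes, eta=0.5):
    """Randomized prediction over a finite expert set (illustrative sketch).

    experts:  array of shape (n_experts, T), each expert's 0/1 prediction per round
    outcomes: array of shape (T,), the true 0/1 labels
    Returns (expected mistakes of the learner, mistakes of the best expert).
    """
    n_experts, T = experts.shape
    weights = np.ones(n_experts)
    expected_mistakes = 0.0

    for t in range(T):
        probs = weights / weights.sum()
        # Probability that the randomized learner predicts 1 at round t.
        p_one = probs[experts[:, t] == 1].sum()
        # Expected 0-1 loss of the randomized prediction against the true label.
        expected_mistakes += p_one if outcomes[t] == 0 else 1.0 - p_one
        # Multiplicative update: shrink the weight of every expert that erred.
        weights *= np.exp(-eta * (experts[:, t] != outcomes[t]))

    best_expert_mistakes = (experts != outcomes).sum(axis=1).min()
    return expected_mistakes, best_expert_mistakes
```

For example, with the two constant predictors h0 ≡ 0 and h1 ≡ 1 (a class of Littlestone dimension 1), Cover's argument forces any deterministic learner to err on every round of an adversarially chosen sequence, giving regret at least T/2, whereas a randomized learner of the above form keeps its expected regret at O(√T).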


Similar articles

Agnostic Online Learning

We study learnability of hypothesis classes in agnostic online prediction models. The analogous question in the PAC learning model [Valiant, 1984] was addressed by Haussler [1992] and others, who showed that the VC dimension characterization of the sample complexity of learnability extends to the agnostic (or "unrealizable") setting. In his influential work, Littlestone [1988] described a combi...


(Agnostic) PAC Learning Concepts in Higher-Order Logic

This paper studies the PAC and agnostic PAC learnability of some standard function classes in the learning in higher-order logic setting introduced by Lloyd et al. In particular, it is shown that the similarity between learning in higher-order logic and traditional attribute-value learning allows many results from computational learning theory to be 'ported' to the logical setting with ease. As ...


Differential Privacy and the Fat-Shattering Dimension of Linear Queries

In this paper, we consider the task of answering linear queries under the constraint of differential privacy. This is a general and well-studied class of queries that captures other commonly studied classes, including predicate queries and histogram queries. We show that the accuracy to which a set of linear queries can be answered is closely related to its fat-shattering dimension, a property t...


Some contributions to fixed-distribution learning theory

In this paper, we consider some problems in learning with respect to a fixed distribution. We introduce two new notions of learnability: probably uniformly approximately correct (PUAC) learnability, which is a stronger requirement than the widely studied PAC learnability, and minimal empirical risk (MER) learnability, which is a stronger requirement than the previously defined notions ...


Deriving Learnability Heuristics for Online Educational Courseware Systems - The First Stage

Over recent years, more and more online courseware has been released to facilitate people's online learning experience. Although these courseware systems have undergone intensive development over the years, course sites developed from them are not user-friendly to students. In addition, the designs of those online educational courseware systems often do not embody particula...



Journal:

Volume   Issue

Pages   -

Publication date: 2008